Efficient convolutional networks learning through irregular convolutional kernels

Authors

Abstract

As deep neural networks are increasingly used in applications suited to low-power devices, a fundamental dilemma emerges: the trend is to develop models trained on ever larger amounts of data, resulting in memory-intensive models, yet such devices have very limited memory and cannot store large models. Parameter pruning is therefore critical for deploying models on these devices. Existing efforts mainly focus on designing highly efficient structures or on pruning redundant connections in networks; they are typically sensitive to the task and rely on dedicated, expensive hashing storage strategies. In this work, we introduce a novel approach that achieves lightweight networks by reconstructing the structure of convolution kernels and their storage. Our method transforms the traditional square kernel into line segments and automatically learns a proper strategy for equipping these segments with diverse features. Experimental results show that our method can significantly reduce the number of parameters (69% pruned on DenseNet-40) and computation costs (by 59%) while maintaining acceptable performance (less than 2% accuracy loss).
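The core idea of the abstract, replacing a dense square kernel with a sparse line segment, can be illustrated with a minimal sketch. This is not the paper's implementation: the paper learns where each segment is placed, whereas here the placement (middle row or column of a 3×3 kernel) is fixed, and the function name `line_segment_kernel` is an illustrative invention.

```python
import numpy as np

def line_segment_kernel(weights, orientation="horizontal"):
    """Embed three stored weights into a 3x3 kernel as a line segment.

    Illustrative sketch only: the paper learns the segment placement;
    here it is fixed to the middle row or column for clarity.
    """
    k = np.zeros((3, 3))
    w = np.asarray(weights, dtype=float)
    if orientation == "horizontal":
        k[1, :] = w   # segment occupies the middle row
    else:
        k[:, 1] = w   # segment occupies the middle column
    return k

k = line_segment_kernel([0.5, 1.0, 0.5])
# Only 3 of the 9 kernel positions carry stored parameters,
# i.e. 2/3 of the per-kernel weights are pruned away.
print(np.count_nonzero(k), "of", k.size, "weights stored")
```

The parameter saving per kernel (3 stored weights instead of 9) is the mechanism behind the pruning ratios reported in the abstract, although the paper's learned placement and feature-equipping strategy are more involved than this fixed mask.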


Similar articles

Learning Efficient Convolutional Networks through Network Slimming

Layer   Width   Width*   Pruned   P/F Pruned
1       64      22       65.6%    34.4%
2       64      62       3.1%     66.7%
3       128     83       35.2%    37.2%
4       128     119      7.0%     39.7%
5       256     193      24.6%    29.9%
6       256     168      34.4%    50.5%
7       256     85       66.8%    78.2%
8       256     40       84.4%    94.8%
9       512     32       93.8%    99.0%
10      512     32       93.8%    99.6%
11      512     32       93.8%    99.6%
12      512     32       93.8%    99.6%
13      512     32       93.8%    99.6%
14      512     32       93.8%    99.6%
15      512     32       93.8%    99.6%
16      512     38       92.6%    99.6%
Total   5504    1034     81...


Irregular Convolutional Neural Networks

Convolutional kernels are basic and vital components of deep Convolutional Neural Networks (CNN). In this paper, we equip convolutional kernels with shape attributes to generate the deep Irregular Convolutional Neural Networks (ICNN). Compared to traditional CNN applying regular convolutional kernels like 3×3, our approach trains irregular kernel shapes to better fit the geometric variations o...


Learning Non-overlapping Convolutional Neural Networks with Multiple Kernels

In this paper, we consider parameter recovery for non-overlapping convolutional neural networks (CNNs) with multiple kernels. We show that when the inputs follow Gaussian distribution and the sample size is sufficiently large, the squared loss of such CNNs is locally strongly convex in a basin of attraction near the global optima for most popular activation functions, like ReLU, Leaky ReLU, Squ...


Deep Clustered Convolutional Kernels

Deep neural networks have recently achieved state-of-the-art performance thanks to new training algorithms for rapid parameter estimation and new regularizations to reduce overfitting. However, in practice the network architecture has to be manually set by domain experts, generally by a costly trial-and-error procedure, which often accounts for a large portion of the final system performance. W...


Pruning Convolutional Neural Networks for Resource Efficient Transfer Learning

We propose a new framework for pruning convolutional kernels in neural networks to enable efficient inference, focusing on transfer learning where large and potentially unwieldy pretrained networks are adapted to specialized tasks. We interleave greedy criteria-based pruning with fine-tuning by backpropagation—a computationally efficient procedure that maintains good generalization in the prune...



Journal

Journal title: Neurocomputing

Year: 2022

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2022.02.065